
    Nucleosynthesis Predictions and High-Precision Deuterium Measurements

    Two new high-precision measurements of the deuterium abundance from absorbers along the line of sight to the quasar PKS1937--1009 were presented. The absorbers have lower neutral hydrogen column densities (N(HI) ≈ 10^18 cm^-2) than for previous high-precision measurements, which bodes well for further extensions of the sample given the plenitude of low column density absorbers. The total high-precision sample now consists of 12 measurements with a weighted average deuterium abundance of D/H = (2.55 ± 0.02) × 10^-5. The sample does not favour a dipole similar to the one detected for the fine structure constant. The increased precision also calls for improved nucleosynthesis predictions. For that purpose we have updated the public AlterBBN code, including new reactions, updated nuclear reaction rates, and the possibility of adding new physics such as dark matter. The standard Big Bang Nucleosynthesis prediction of D/H = (2.456 ± 0.057) × 10^-5 is consistent with the observed value within 1.7 standard deviations. Comment: 10 pages, 5 figures, conference proceedings from VarCosmoFun 201
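
    The weighted average and the quoted tension follow from a standard inverse-variance combination. A minimal sketch is given below, with placeholder measurements rather than the actual 12-point sample; note that a simple quadrature combination of the two quoted uncertainties gives roughly 1.6 sigma, and the exact figure depends on how the observational and theoretical errors are combined.

```python
import numpy as np

# Hypothetical D/H measurements (value, 1-sigma error) in units of 1e-5.
# Placeholders for illustration, not the actual high-precision sample.
measurements = np.array([
    (2.53, 0.05),
    (2.58, 0.07),
    (2.55, 0.03),
])
values, errors = measurements[:, 0], measurements[:, 1]
weights = 1.0 / errors**2

# Inverse-variance weighted mean and its uncertainty.
mean = np.sum(weights * values) / np.sum(weights)
mean_err = 1.0 / np.sqrt(np.sum(weights))

# Tension between observation and prediction, adding uncertainties in quadrature.
obs, obs_err = 2.55, 0.02      # observed weighted average (1e-5)
th, th_err = 2.456, 0.057      # BBN prediction (1e-5)
tension = abs(obs - th) / np.hypot(obs_err, th_err)

print(f"weighted mean = {mean:.3f} +/- {mean_err:.3f} (x 1e-5)")
print(f"observation vs. prediction: {tension:.2f} sigma")
```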

    Combining Planck with Large Scale Structure gives strong neutrino mass constraint

    We present the strongest current cosmological upper limit on the sum of neutrino masses: Σm_ν < 0.18 eV (95% confidence). It is obtained by adding observations of the large-scale matter power spectrum from the WiggleZ Dark Energy Survey to observations of the cosmic microwave background from the Planck surveyor, and measurements of the baryon acoustic oscillation scale. The limit is highly sensitive to the priors and assumptions about the neutrino scenario. We explore scenarios with neutrino masses close to the upper limit (degenerate masses), neutrino masses close to the lower limit where the hierarchy plays a role, and the addition of massive or massless sterile species. Comment: 7 pages, 4 figures. Found bug in analysis which is fixed in v2. The resulting constraints on M_nu remain very strong. Additional info added on hierarchy.

    WiggleZ Dark Energy Survey: Cosmological neutrino mass constraint from blue high-redshift galaxies

    The absolute neutrino mass scale is currently unknown, but can be constrained by cosmology. The WiggleZ high redshift, star-forming, and blue galaxy sample offers a complementary data set to previous surveys for performing these measurements, with potentially different systematics from nonlinear structure formation, redshift-space distortions, and galaxy bias. We obtain a limit of ∑m_ν < 0.60 eV (95% confidence) for WiggleZ+Wilkinson Microwave Anisotropy Probe. Combining with priors on the Hubble parameter and the baryon acoustic oscillation scale gives ∑m_ν < 0.29 eV, which is the strongest neutrino mass constraint derived from spectroscopic galaxy redshift surveys.

    Cosmological neutrino mass constraint from the WiggleZ Dark Energy Survey

    The absolute neutrino mass scale is currently unknown, but can be constrained from cosmology. We use the large-scale structure information from the WiggleZ Dark Energy Survey to constrain the sum of neutrino masses. The WiggleZ high-redshift, star-forming, blue galaxy sample is less sensitive to systematic effects from non-linear structure formation, pairwise galaxy velocities, redshift-space distortions, and galaxy bias than previous surveys. Through exhaustive tests using numerical dark-matter simulations of the WiggleZ survey, we demonstrate that at small scales common modelling approaches lead to systematic errors in the recovered cosmological parameters, and we use the simulations to calibrate a new non-linear fitting formula extending to small scales (k = 0.3 h Mpc^-1). We obtain an upper limit on the sum of neutrino masses of 0.60 eV (95% confidence) for WiggleZ+Wilkinson Microwave Anisotropy Probe. Combining with priors on the Hubble parameter and the baryon acoustic oscillation scale gives an upper limit of 0.29 eV, which is the strongest neutrino mass constraint derived from spectroscopic galaxy redshift surveys.
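
    To see roughly why the matter power spectrum constrains the neutrino mass sum, a rule-of-thumb sketch (not the WiggleZ likelihood analysis) can use the standard relation Ω_ν h² ≈ Σm_ν / 93 eV together with the approximation that free-streaming neutrinos suppress the small-scale matter power spectrum by ΔP/P ≈ -8 f_ν, with f_ν = Ω_ν/Ω_m; the fiducial Ω_m and h values below are generic assumptions.

```python
# Rough rule-of-thumb illustration (not the survey likelihood): how the sum of
# neutrino masses maps to a fractional suppression of small-scale matter power.

def neutrino_power_suppression(sum_mnu_eV, omega_m=0.31, h=0.68):
    """Approximate fractional suppression Delta P / P at small scales.

    Uses Omega_nu h^2 ~ sum(m_nu) / 93 eV and Delta P / P ~ -8 * f_nu,
    both standard approximations valid for small f_nu.
    """
    omega_nu = sum_mnu_eV / 93.0 / h**2
    f_nu = omega_nu / omega_m
    return -8.0 * f_nu

# Roughly the oscillation lower bound and the two quoted upper limits, in eV.
for mnu in (0.06, 0.29, 0.60):
    print(f"sum m_nu = {mnu:.2f} eV -> Delta P/P ~ {neutrino_power_suppression(mnu):+.1%}")
```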

    Sterile neutrinos in the Milky Way: Observational constraints

    We consider the possibility of constraining decaying dark matter by looking out through the Milky Way halo. Specifically, we use Chandra blank-sky observations to constrain the parameter space of sterile neutrinos. We find that a broad band in parameter space is still open, leaving the sterile neutrino as an excellent dark matter candidate. Comment: Submitted to ApJL, 4 pages, 4 figures.
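
    For context on the scaling involved, a back-of-the-envelope sketch with made-up numbers (not the constraints derived in the paper): the decay of a sterile neutrino of mass m_s produces a narrow X-ray line near E ≈ m_s/2, the expected line flux from a blank-sky pointing is F = Γ S_DM / (4π m_s) with S_DM the dark matter column density along the line of sight, and the radiative decay rate scales as Γ ∝ sin²(2θ) m_s⁵.

```python
import math

# Back-of-the-envelope sketch; all numerical inputs below are placeholders.
KEV_TO_G = 1.783e-33  # 1 keV/c^2 expressed in grams

def line_flux(gamma_decay_s, m_s_keV, dm_column_g_cm2):
    """Decay-line photon flux (photons / cm^2 / s / sr) at E ~ m_s / 2.

    F = Gamma * S_DM / (4 * pi * m_s), with S_DM the dark matter mass column
    density along the line of sight through the halo.
    """
    m_s_g = m_s_keV * KEV_TO_G
    n_column = dm_column_g_cm2 / m_s_g   # sterile neutrinos per cm^2
    return gamma_decay_s * n_column / (4.0 * math.pi)

# Hypothetical decay rate and a Milky-Way-like dark matter column density.
print(line_flux(gamma_decay_s=1e-30, m_s_keV=5.0, dm_column_g_cm2=0.01))
```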

    Mutual information estimation for graph convolutional neural networks

    Measuring model performance is a key issue for deep learning practitioners. However, we often lack the ability to explain why a specific architecture attains superior predictive accuracy for a given data set. Often, validation accuracy is used as a performance heuristic quantifying how well a network generalizes to unseen data, but it does not capture anything about the information flow in the model. Mutual information can be used as a measure of the quality of internal representations in deep learning models, and the information plane may provide insights into whether the model exploits the available information in the data. The information plane has previously been explored for fully connected neural networks and convolutional architectures. We present an architecture-agnostic method for tracking a network's internal representations during training, which are then used to create the mutual information plane. The method is exemplified for graph-based neural networks fitted on citation data. We compare how the inductive bias introduced in graph-based architectures changes the mutual information plane relative to a fully connected neural network. Comment: Northern Lights Deep Learning proceedings, 8 pages, 3 figures.
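
    A minimal, architecture-agnostic sketch of the kind of estimate needed to build an information plane (a generic binning estimator, not necessarily the one used in the paper): discretize the activations of a layer, treat each binned activation vector as a symbol T, and estimate I(T; Y) against the labels from empirical joint frequencies; I(X; T) can be estimated the same way by binning the inputs.

```python
import numpy as np

def discrete_mutual_information(codes, labels):
    """I(T; Y) in nats from co-occurrence counts of discrete codes and labels."""
    n = len(codes)
    joint, p_c, p_y = {}, {}, {}
    for c, y in zip(codes, labels):
        joint[(c, y)] = joint.get((c, y), 0) + 1
        p_c[c] = p_c.get(c, 0) + 1 / n
        p_y[y] = p_y.get(y, 0) + 1 / n
    mi = 0.0
    for (c, y), k in joint.items():
        p_cy = k / n
        mi += p_cy * np.log(p_cy / (p_c[c] * p_y[y]))
    return mi

def binned_mi(activations, labels, n_bins=30):
    """Bin each unit's activation, hash the binned vector, and estimate I(T; Y)."""
    a = np.asarray(activations, dtype=float)
    lo, hi = a.min(), a.max()
    bins = np.floor((a - lo) / max(hi - lo, 1e-12) * (n_bins - 1)).astype(int)
    codes = [row.tobytes() for row in bins]
    return discrete_mutual_information(codes, list(labels))

# Toy usage: hidden-layer activations for 1000 samples with binary labels.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)
acts = rng.normal(size=(1000, 8)) + y[:, None]  # layer partially encodes the label
print(f"I(T; Y) ~ {binned_mi(acts, y):.3f} nats")
```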

    A gradient boosting approach for optimal selection of bidding strategies in reservoir hydro

    Power producers use a wide range of decision support systems to manage and plan for sales in the day-ahead electricity market, and they are often faced with the challenge of choosing the most advantageous bidding strategy for any given day. The optimal solution is not known until after spot clearing. Results from the models and strategies used, and their impact on profitability, can either be registered continuously or simulated using historical data. Access to an increasing amount of data makes it possible to apply machine learning models to predict the best combination of model and strategy for any given day. In this article, the historical performance of two given bidding strategies over several years has been analyzed with a combination of domain knowledge and machine learning techniques (gradient boosting and neural networks). A wide range of variables accessible to the models prior to bidding has been evaluated to predict the optimal strategy for a given day. Results indicate that a machine learning model can learn to slightly outperform a static strategy where one bidding method is chosen based on overall historical performance.
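
    A minimal sketch of the general setup (synthetic features and labels, not the authors' data set, feature engineering, or tuned models): frame strategy selection as binary classification, where the label marks which of the two bidding strategies turned out more profitable after spot clearing, and fit a gradient boosting classifier on features available before bidding.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_days = 2000

# Hypothetical pre-bidding features: price forecast, inflow forecast,
# reservoir filling, day of week.  All synthetic placeholders.
X = np.column_stack([
    rng.normal(40, 10, n_days),   # forecast spot price (EUR/MWh)
    rng.gamma(2.0, 5.0, n_days),  # forecast inflow
    rng.uniform(0, 1, n_days),    # reservoir filling fraction
    rng.integers(0, 7, n_days),   # day of week
])
# Label: 1 if strategy A beat strategy B on that day (synthetic rule for illustration).
y = (X[:, 0] + 5 * X[:, 2] + rng.normal(0, 5, n_days) > 42).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.05)
model.fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```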

    Neural Operator Learning for Long-Time Integration in Dynamical Systems with Recurrent Neural Networks

    Deep neural networks are an attractive alternative for simulating complex dynamical systems, as, compared to traditional scientific computing methods, they offer reduced computational costs during inference and can be trained directly from observational data. Existing methods, however, cannot extrapolate accurately and are prone to error accumulation in long-time integration. Herein, we address this issue by combining neural operators with recurrent neural networks to construct a novel and effective architecture, resulting in superior accuracy compared to the state of the art. The new hybrid model is based on operator learning while offering a recurrent structure to capture temporal dependencies. The integrated framework is shown to stabilize the solution and reduce error accumulation for both interpolation and extrapolation of the Korteweg-de Vries equation. Comment: 12 pages, 5 figures.
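
    A heavily simplified sketch of the general idea (illustrative layer sizes and a toy spectral operator, not the architecture in the paper): a learned operator advances the discretized field one step, a GRU cell carries a hidden state across steps to capture temporal dependencies, and long-time integration is performed by rolling the hybrid model out autoregressively.

```python
import torch
import torch.nn as nn

class SpectralStep(nn.Module):
    """Toy operator layer: a pointwise map plus a learned filter on low Fourier modes."""
    def __init__(self, n_grid, n_modes=16):
        super().__init__()
        self.pointwise = nn.Linear(n_grid, n_grid)
        self.filter = nn.Parameter(0.1 * torch.randn(n_modes, dtype=torch.cfloat))
        self.n_modes = n_modes

    def forward(self, u):                                  # u: (batch, n_grid)
        u_hat = torch.fft.rfft(u, dim=-1)
        filtered = torch.zeros_like(u_hat)
        filtered[:, : self.n_modes] = u_hat[:, : self.n_modes] * self.filter
        u_spec = torch.fft.irfft(filtered, n=u.shape[-1], dim=-1)
        return torch.tanh(self.pointwise(u)) + u_spec

class OperatorRNN(nn.Module):
    """Operator step combined with a GRU hidden state for temporal memory."""
    def __init__(self, n_grid, hidden=64):
        super().__init__()
        self.op = SpectralStep(n_grid)
        self.rnn = nn.GRUCell(n_grid, hidden)
        self.readout = nn.Linear(hidden, n_grid)

    def forward(self, u0, n_steps):
        u = u0
        h = torch.zeros(u0.shape[0], self.rnn.hidden_size)
        states = []
        for _ in range(n_steps):                           # autoregressive rollout
            h = self.rnn(self.op(u), h)
            u = u + self.readout(h)                        # residual update of the field
            states.append(u)
        return torch.stack(states, dim=1)                  # (batch, n_steps, n_grid)

model = OperatorRNN(n_grid=128)
u0 = torch.sin(torch.linspace(0, 6.283, 128)).unsqueeze(0)  # one toy initial condition
print(model(u0, n_steps=10).shape)                          # torch.Size([1, 10, 128])
```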

    Pseudo-Hamiltonian neural networks with state-dependent external forces

    Hybrid machine learning based on Hamiltonian formulations has recently been successfully demonstrated for simple mechanical systems, both energy conserving and not energy conserving. We introduce a pseudo-Hamiltonian formulation that is a generalization of the Hamiltonian formulation via the port-Hamiltonian formulation, and show that pseudo-Hamiltonian neural network models can be used to learn external forces acting on a system. We argue that this property is particularly useful when the external forces are state dependent, in which case it is the pseudo-Hamiltonian structure that facilitates the separation of internal and external forces. Numerical results are provided for a forced and damped mass–spring system and a tank system of higher complexity, and a symmetric fourth-order integration scheme is introduced for improved training on sparse and noisy data.
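
    A minimal sketch of the kind of model structure such a formulation suggests, under the assumption that the dynamics are written as dx/dt = (S - R) ∇H(x) + F(x), with a skew-symmetric S, a damping matrix R, a neural network for the Hamiltonian H, and a separate network for the state-dependent external force F; the matrix choices and layer sizes below are illustrative, not the authors' implementation.

```python
import torch
import torch.nn as nn

class PseudoHamiltonianNN(nn.Module):
    """Learned Hamiltonian and external force with fixed structure/damping matrices."""
    def __init__(self, dim=2):
        super().__init__()
        self.H = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, 1))
        self.F = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))
        # Illustrative choices for a 1-D damped mass-spring-like system.
        self.register_buffer("S", torch.tensor([[0.0, 1.0], [-1.0, 0.0]]))  # skew-symmetric
        self.register_buffer("R", torch.diag(torch.tensor([0.0, 0.1])))     # damping

    def forward(self, x):
        """Predicted time derivative dx/dt = (S - R) grad H(x) + F(x)."""
        x = x.requires_grad_(True)
        H = self.H(x).sum()
        grad_H = torch.autograd.grad(H, x, create_graph=True)[0]
        return grad_H @ (self.S - self.R).T + self.F(x)

model = PseudoHamiltonianNN()
x = torch.tensor([[1.0, 0.0]])   # state (position, momentum)
print(model(x))                  # predicted dx/dt, shape (1, 2)
```

    Training such a model typically amounts to matching the predicted dx/dt (or an integrated step) against observed trajectories, which lets the separate F network absorb the state-dependent external force while H captures the internal dynamics.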

    Port-Hamiltonian Neural Networks with State-Dependent Ports

    Hybrid machine learning based on Hamiltonian formulations has recently been successfully demonstrated for simple mechanical systems, both energy conserving and not energy conserving. We show that port-Hamiltonian neural network models can be used to learn external forces acting on a system. We argue that this property is particularly useful when the external forces are state dependent, in which case it is the port-Hamiltonian structure that facilitates the separation of internal and external forces. Numerical results are provided for a forced and damped mass-spring system and a tank system of higher complexity, and a symmetric fourth-order integration scheme is introduced for improved training on sparse and noisy data. Comment: 21 pages, 12 figures; v3: restructured the paper for more clarity, major changes to the text, updated plots.